Search for: All records

Creators/Authors contains: "Wang, Ruogu"

  1. Abstract: The accuracy of assigning fluorophore identity and abundance, known as spectral unmixing, in biological fluorescence microscopy images remains a significant challenge due to the substantial overlap in emission spectra among fluorophores. In traditional laser scanning confocal spectral microscopy, fluorophore information is acquired by recording emission spectra with a single combination of discrete excitation wavelengths. However, organic fluorophores possess characteristic excitation spectra in addition to their unique emission spectral signatures. In this paper, we propose a generalized multi-view machine learning approach that leverages both excitation and emission spectra to significantly improve the accuracy in differentiating multiple highly overlapping fluorophores in a single image. By recording emission spectra of the same field with multiple combinations of excitation wavelengths, we obtain data representing different views of the underlying fluorophore distribution in the sample. We then propose a multi-view machine learning framework that allows for the flexible incorporation of noise information and abundance constraints, enabling the extraction of spectral signatures from reference images and efficient recovery of corresponding abundances in unknown mixed images. Numerical experiments on simulated image data demonstrate the method’s efficacy in improving accuracy, allowing for the discrimination of 100 fluorophores with highly overlapping spectra. Furthermore, validation on images of mixtures of fluorescently labeled Escherichia coli highlights the power of the proposed multi-view strategy in discriminating fluorophores with spectral overlap in real biological images.
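A minimal sketch of the multi-view idea described in the abstract above, under stated assumptions: emission spectra recorded under each excitation combination ("view") are concatenated per pixel, reference spectra are concatenated the same way, and abundances are recovered by non-negative least squares. This is only an illustration of the stacking step, not the authors' framework, which also incorporates noise information and abundance constraints; all names and array shapes below are assumptions.

```python
# Illustrative multi-view stacking + non-negative least squares unmixing.
# Not the paper's method; a hypothetical sketch of the joint-view idea.

import numpy as np
from scipy.optimize import nnls

def stack_views(views):
    """Concatenate per-view spectra along the spectral axis.

    views: list of arrays, each of shape (n_pixels, n_emission_channels),
           one array per excitation-wavelength combination ("view").
    Returns an array of shape (n_pixels, total_channels).
    """
    return np.concatenate(views, axis=1)

def unmix_multiview(pixel_views, reference_views):
    """Recover non-negative abundances from stacked multi-view spectra.

    pixel_views:     list of (n_pixels, channels) arrays for the mixed image.
    reference_views: list of (n_fluorophores, channels) arrays measured on
                     single-fluorophore reference samples under the same views.
    Returns abundances of shape (n_pixels, n_fluorophores).
    """
    Y = stack_views(pixel_views)        # (n_pixels, total_channels)
    S = stack_views(reference_views)    # (n_fluorophores, total_channels)
    A = np.zeros((Y.shape[0], S.shape[0]))
    for i, y in enumerate(Y):
        A[i], _ = nnls(S.T, y)          # min ||S^T a - y||_2 subject to a >= 0
    return A

# Toy usage: 3 fluorophores, 2 excitation views, 5 pixels.
rng = np.random.default_rng(0)
refs = [rng.random((3, 32)), rng.random((3, 32))]
true_A = rng.random((5, 3))
pixels = [true_A @ r for r in refs]
print(unmix_multiview(pixels, refs).round(2))
```

Stacking the views lengthens each pixel's measurement vector, which is what makes otherwise indistinguishable emission spectra separable when their excitation responses differ.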
  2. Abstract
     Motivation: Multispectral biological fluorescence microscopy has enabled the identification of multiple targets in complex samples. The accuracy of the unmixing result degrades (i) as the number of fluorophores used in any experiment increases and (ii) as the signal-to-noise ratio in the recorded images decreases. Further, the availability of prior knowledge regarding the expected spatial distributions of fluorophores in images of labeled cells provides an opportunity to improve the accuracy of fluorophore identification and abundance estimation.
     Results: We propose a regularized sparse and low-rank Poisson regression unmixing approach (SL-PRU) to deconvolve spectral images, labeled with highly overlapping fluorophores, that are recorded in low signal-to-noise regimes. First, SL-PRU uses multiple penalty terms to pursue sparseness and spatial correlation of the resulting abundances in small neighborhoods simultaneously. Second, SL-PRU uses Poisson regression for unmixing instead of least squares regression to better estimate photon abundance. Third, we propose a method to tune the SL-PRU parameters involved in the unmixing procedure in the absence of ground-truth abundance information for a recorded image. By validating on simulated and real-world images, we show that our proposed method leads to improved accuracy in unmixing fluorophores with highly overlapping spectra.
     Availability and implementation: The source code used for this article was written in MATLAB and is available with the test data at https://github.com/WANGRUOGU/SL-PRU.
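A minimal sketch of the per-pixel Poisson-plus-sparsity objective described in the abstract above, under stated assumptions. The released SL-PRU code is in MATLAB at the GitHub link; this Python sketch omits the neighborhood low-rank penalty and the parameter-tuning scheme, and the function names, solver choice, and initialization are assumptions, not the authors' implementation.

```python
# Illustrative per-pixel Poisson regression unmixing with an L1 sparsity
# penalty (a hypothetical sketch, not the SL-PRU release at
# https://github.com/WANGRUOGU/SL-PRU).

import numpy as np
from scipy.optimize import minimize

def poisson_sparse_unmix(y, S, lam=0.1, eps=1e-9):
    """Estimate abundances a >= 0 for one pixel.

    y   : observed photon counts per spectral channel, shape (C,)
    S   : endmember spectra, shape (C, K), non-negative entries
    lam : L1 penalty weight encouraging sparse abundances
    Minimizes the negative Poisson log-likelihood plus lam * ||a||_1.
    """
    C, K = S.shape

    def objective(a):
        mu = S @ a + eps                      # expected counts per channel
        nll = np.sum(mu - y * np.log(mu))     # Poisson negative log-likelihood
        return nll + lam * np.sum(a)          # a >= 0, so ||a||_1 = sum(a)

    def grad(a):
        mu = S @ a + eps
        return S.T @ (1.0 - y / mu) + lam

    a0 = np.full(K, y.sum() / max(S.sum(), eps))   # crude non-negative start
    res = minimize(objective, a0, jac=grad, method="L-BFGS-B",
                   bounds=[(0.0, None)] * K)
    return res.x

# Toy usage: 16 channels, 4 overlapping endmembers, sparse ground truth.
rng = np.random.default_rng(1)
S = rng.random((16, 4)) + 0.1
a_true = np.array([5.0, 0.0, 2.0, 0.0])
y = rng.poisson(S @ a_true)
print(poisson_sparse_unmix(y, S, lam=0.05).round(2))
```

Modeling the counts as Poisson rather than Gaussian matches photon-limited detection, which is why the abstract contrasts Poisson regression with least squares in low signal-to-noise regimes.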